In [1]:
!pip install nbconvert
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Requirement already satisfied: nbconvert in /usr/local/lib/python3.7/dist-packages (5.6.1)
In [2]:
!jupyter nbconvert --to html /content/project.ipynb
[NbConvertApp] WARNING | pattern '/content/project.ipynb' matched no files

Intro:

We, Shani Ben Yitzhak and Ayala Rubinstein, are a pair of computer science students interested in autism and how it can be identified in toddlers from their daily behaviour. We decided to explore the subject in depth as our project for the data science course.

The problem

Autistic Spectrum Disorder (ASD) is a neurodevelopmental condition associated with significant healthcare costs, and early diagnosis can significantly reduce them. Unfortunately, waiting times for an ASD diagnosis are lengthy and the procedures are not cost effective. The economic impact of autism and the increase in the number of ASD cases across the world reveal an urgent need for easily implemented and effective screening methods. A time-efficient and accessible ASD screening method is therefore urgently needed, to help health professionals and to inform individuals whether they should pursue a formal clinical diagnosis.

The rapid growth in the number of ASD cases worldwide calls for datasets on behavioural traits. However, such datasets are rare, making it difficult to perform thorough analyses that would improve the efficiency, sensitivity, specificity and predictive accuracy of the ASD screening process.

At present, very few autism datasets associated with clinical screening are available, and most of them are genetic in nature. We found a dataset on autism screening of toddlers that contains influential features for further analysis, especially for determining autistic traits and improving the classification of ASD cases. The dataset includes ten behavioural items (Q-Chat-10) plus other individual characteristics that have proved effective in separating ASD cases from controls in behavioural science.

We would like to find a model with maximum accuracy for predicting ASD in toddlers according to their behavioural features.

Attributes:

A1-A10: Items of the Q-Chat-10, each answered on the scale "Always, Usually, Sometimes, Rarely, Never" and mapped to "1" or "0" in the dataset. For questions 1-9 (A1-A9), "1" is assigned if the response was Sometimes, Rarely or Never. For question 10 (A10), "1" is assigned if the response was Always, Usually or Sometimes. The ten item values are then summed to give the Q-Chat-10 score; if a child scores more than 3, there are potential ASD traits, otherwise no ASD traits are observed.
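The scoring rule described above can be sketched as a small helper. This is an illustrative, hypothetical function (the dataset itself stores only the already-mapped 0/1 values):

```python
# Hypothetical helper illustrating the Q-Chat-10 scoring rule.
# Items A1-A9 score "1" for Sometimes/Rarely/Never;
# item A10 is reverse-keyed and scores "1" for Always/Usually/Sometimes.
# A total score above 3 flags potential ASD traits.

def qchat10_score(responses):
    """responses: list of 10 answers, each one of
    'Always', 'Usually', 'Sometimes', 'Rarely', 'Never'."""
    assert len(responses) == 10
    score = 0
    for i, answer in enumerate(responses):
        if i < 9:  # A1-A9
            score += int(answer in ('Sometimes', 'Rarely', 'Never'))
        else:      # A10
            score += int(answer in ('Always', 'Usually', 'Sometimes'))
    return score

def has_asd_traits(responses):
    return qchat10_score(responses) > 3

# All-'Never' answers: A1-A9 contribute 1 each, A10 contributes 0.
print(qchat10_score(['Never'] * 10))  # 9
```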

The remaining features in the dataset are collected from the "submit" screen of the ASDTests screening app. It should be noted that the class variable was assigned automatically based on the score the user obtained during the screening process with the ASDTests app.

Details of the mapping between dataset variables and the Q-Chat-10-Toddler items:

A1: Does your child look at you when you call his/her name?
A2: How easy is it for you to get eye contact with your child?
A3: Does your child point to indicate that s/he wants something? (e.g. a toy that is out of reach)
A4: Does your child point to share interest with you? (e.g. pointing at an interesting sight)
A5: Does your child pretend? (e.g. care for dolls, talk on a toy phone)
A6: Does your child follow where you’re looking?
A7: If you or someone else in the family is visibly upset, does your child show signs of wanting to comfort them? (e.g. stroking hair, hugging them)
A8: Would you describe your child’s first words as:
A9: Does your child use simple gestures? (e.g. wave goodbye)
A10: Does your child stare at nothing with no apparent purpose?

Features collected and their descriptions:

Feature (type): Description
A1-A10 (Number): The answer code of each question, based on the screening method used
Qchat-10-Score (Number): 1-10 (less than or equal to 3: no ASD traits; greater than 3: ASD traits)
Age_Mons (Number): Toddler age in months
Sex (Character): male / female
Ethnicity (String): List of common ethnicities in text format
Jaundice (Boolean): Whether the case was born with jaundice
Family_mem_with_ASD (Boolean): Whether any immediate family member has a PDD
Who completed the test (String): Parent, self, caregiver, medical staff, clinician, etc.
Class/ASD Traits (String): Yes / No (ASD traits or no ASD traits, assigned automatically by the ASDTests app)

Importing libraries

In [3]:
import numpy as np
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt
import plotly.express as px


# Dimensionality reduction
from sklearn.decomposition import PCA


# Classification
from sklearn.mixture import GaussianMixture 
from sklearn.cluster import KMeans 
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import RandomForestClassifier 
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis 

from xgboost import XGBClassifier

from keras.models import Sequential
from keras.layers import *
from tensorflow.keras.optimizers import Adam, RMSprop
from tensorflow.keras.utils import plot_model


# Regression
from sklearn.linear_model import LogisticRegression


# preprocessing
from sklearn.preprocessing import StandardScaler
from scipy.stats import mode



# Modelling Helpers :
from sklearn.model_selection import train_test_split
from sklearn.model_selection import GridSearchCV
from sklearn.metrics import accuracy_score,classification_report
from sklearn.metrics import plot_confusion_matrix
from sklearn.metrics import confusion_matrix


# shap
!pip install shap &> /dev/null
import shap
shap.initjs()

We will import the data from the drive:

In [4]:
from google.colab import drive
In [5]:
drive.mount('/content/gdrive')
Mounted at /content/gdrive

EDA

Let's load our data and examine it:

In [6]:
df = pd.read_csv('/content/gdrive/MyDrive/Toddler Autism dataset July 2018.csv')
df.head()
Out[6]:
Case_No A1 A2 A3 A4 A5 A6 A7 A8 A9 A10 Age_Mons Qchat-10-Score Sex Ethnicity Jaundice Family_mem_with_ASD Who completed the test Class/ASD Traits
0 1 0 0 0 0 0 0 1 1 0 1 28 3 f middle eastern yes no family member No
1 2 1 1 0 0 0 1 1 0 0 0 36 4 m White European yes no family member Yes
2 3 1 0 0 0 0 0 1 1 0 1 36 4 m middle eastern yes no family member Yes
3 4 1 1 1 1 1 1 1 1 1 1 24 10 m Hispanic no no family member Yes
4 5 1 1 0 1 1 1 1 1 1 1 20 9 f White European no yes family member Yes
In [7]:
df = df.drop(['Case_No'], axis = 1)
In [8]:
df.dtypes
Out[8]:
A1                         int64
A2                         int64
A3                         int64
A4                         int64
A5                         int64
A6                         int64
A7                         int64
A8                         int64
A9                         int64
A10                        int64
Age_Mons                   int64
Qchat-10-Score             int64
Sex                       object
Ethnicity                 object
Jaundice                  object
Family_mem_with_ASD       object
Who completed the test    object
Class/ASD Traits          object
dtype: object
In [9]:
df.describe()
Out[9]:
A1 A2 A3 A4 A5 A6 A7 A8 A9 A10 Age_Mons Qchat-10-Score
count 1054.000000 1054.000000 1054.000000 1054.000000 1054.000000 1054.000000 1054.000000 1054.000000 1054.000000 1054.000000 1054.000000 1054.000000
mean 0.563567 0.448767 0.401328 0.512334 0.524668 0.576850 0.649905 0.459203 0.489564 0.586338 27.867173 5.212524
std 0.496178 0.497604 0.490400 0.500085 0.499628 0.494293 0.477226 0.498569 0.500128 0.492723 7.980354 2.907304
min 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 12.000000 0.000000
25% 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 23.000000 3.000000
50% 1.000000 0.000000 0.000000 1.000000 1.000000 1.000000 1.000000 0.000000 0.000000 1.000000 30.000000 5.000000
75% 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 36.000000 8.000000
max 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 36.000000 10.000000

Create a new DataFrame with numeric values:

Features that have two possible values:

In [10]:
df_numeric = pd.get_dummies(df, columns=[ 'Sex', 'Jaundice', 'Family_mem_with_ASD', 'Class/ASD Traits '], drop_first=True)
df_numeric.head()
Out[10]:
A1 A2 A3 A4 A5 A6 A7 A8 A9 A10 Age_Mons Qchat-10-Score Ethnicity Who completed the test Sex_m Jaundice_yes Family_mem_with_ASD_yes Class/ASD Traits _Yes
0 0 0 0 0 0 0 1 1 0 1 28 3 middle eastern family member 0 1 0 0
1 1 1 0 0 0 1 1 0 0 0 36 4 White European family member 1 1 0 1
2 1 0 0 0 0 0 1 1 0 1 36 4 middle eastern family member 1 1 0 1
3 1 1 1 1 1 1 1 1 1 1 24 10 Hispanic family member 1 0 0 1
4 1 1 0 1 1 1 1 1 1 1 20 9 White European family member 0 0 1 1

Features that have more than two possible values:

In [11]:
cat_features = df_numeric.select_dtypes(include='object')
num_features = df_numeric.select_dtypes(exclude='object')
df_numeric = pd.get_dummies(df_numeric, drop_first=True)
df_numeric.head()
Out[11]:
A1 A2 A3 A4 A5 A6 A7 A8 A9 A10 ... Ethnicity_White European Ethnicity_asian Ethnicity_black Ethnicity_middle eastern Ethnicity_mixed Ethnicity_south asian Who completed the test_Health care professional Who completed the test_Others Who completed the test_Self Who completed the test_family member
0 0 0 0 0 0 0 1 1 0 1 ... 0 0 0 1 0 0 0 0 0 1
1 1 1 0 0 0 1 1 0 0 0 ... 1 0 0 0 0 0 0 0 0 1
2 1 0 0 0 0 0 1 1 0 1 ... 0 0 0 1 0 0 0 0 0 1
3 1 1 1 1 1 1 1 1 1 1 ... 0 0 0 0 0 0 0 0 0 1
4 1 1 0 1 1 1 1 1 1 1 ... 1 0 0 0 0 0 0 0 0 1

5 rows × 30 columns

In [12]:
df_numeric.shape
Out[12]:
(1054, 30)
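The dummy (one-hot) encoding applied in the cells above can be illustrated on a toy frame (the values here are illustrative, not rows of the dataset):

```python
import pandas as pd

# One binary and one multi-valued categorical column.
toy = pd.DataFrame({
    'Sex': ['m', 'f', 'm'],
    'Ethnicity': ['asian', 'mixed', 'asian'],
})

# drop_first=True drops one indicator per column, since it is fully
# determined by the remaining ones (avoids perfect collinearity).
encoded = pd.get_dummies(toy, drop_first=True)
print(encoded.columns.tolist())  # ['Sex_m', 'Ethnicity_mixed']
```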

Now we have 30 features.

We separate the class (target) column from the feature space and drop the irrelevant columns:

In [13]:
classes = df_numeric['Class/ASD Traits _Yes']
features = df_numeric.drop(['Class/ASD Traits _Yes', 'Qchat-10-Score'], axis = 1 )

Normalization:

In [14]:
features_norm = StandardScaler().fit_transform(features)
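StandardScaler standardizes each column to zero mean and unit variance, z = (x - mean) / std. A quick sketch on toy numbers:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

X = np.array([[1.0, 10.0],
              [2.0, 20.0],
              [3.0, 30.0]])

# fit_transform learns each column's mean/std and standardizes in one step.
Z = StandardScaler().fit_transform(X)
print(Z.mean(axis=0))  # approximately [0. 0.]
print(Z.std(axis=0))   # approximately [1. 1.]
```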

Dimensionality reduction

In [15]:
pca = PCA()
components = pca.fit_transform(features)

Let's see how the components affect the cumulative explained variance (note that the PCA here is fit on the unscaled features, so a high-variance column such as Age_Mons dominates the first component):

In [16]:
plt.figure(figsize=(8,8))
plt.plot(range(0,28),pca.explained_variance_ratio_.cumsum(), marker='o', linestyle='--')
plt.title('Explained variance by component')
plt.xlabel('Number Of Component')
plt.ylabel('Cumulative Explained Variance')
Out[16]:
Text(0, 0.5, 'Cumulative Explained Variance')

We see that the first component alone explains 95.8% of the variance, so additional components add little information.
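Instead of reading the cumulative-variance curve by eye, scikit-learn's PCA can pick the number of components for a target variance ratio; a sketch on toy low-rank data:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Toy data: 100 samples in 5 dimensions, essentially rank 2 plus tiny noise.
X = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 5))
X += 0.01 * rng.normal(size=X.shape)

# A float n_components in (0, 1) keeps just enough components to reach
# that cumulative explained-variance ratio.
pca = PCA(n_components=0.95)
pca.fit(X)
print(pca.n_components_)  # at most 2: the data is essentially 2-dimensional
```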

In [17]:
labels = {
    str(i): f"PC {i+1} ({var:.1f}%)"
    for i, var in enumerate(pca.explained_variance_ratio_ * 100)
}

fig = px.scatter_matrix(
    components,
    labels=labels,
    dimensions=range(5),
    color=df['Class/ASD Traits ']
)
fig.update_traces(diagonal_visible=False,)

It is nice to see visually that PC1 versus PC2 gives the clearest separation of the data.

We can see this also in a three-dimensional plot of the first three components:

In [18]:
pca = PCA(n_components=3)
components = pca.fit_transform(features)

total_var = pca.explained_variance_ratio_.sum() * 100

fig = px.scatter_3d(
    components, x=0, y=1, z=2, color=df['Class/ASD Traits '],
    title=f'Total Explained Variance: {total_var:.2f}%',
    labels={'0': 'PC 1', '1': 'PC 2', '2': 'PC 3'}
)
fig.show()

With 3 components we reach a sufficient cumulative explained variance of 96.27%, and the separation is clearly visible in this interactive plot.

Check how much of the variance each PC explains:

In [19]:
pca.explained_variance_
Out[19]:
array([63.71158554,  0.89889163,  0.33832484])

Now we will see how these first three components contribute to the cumulative explained variance:

In [20]:
plt.figure()
plt.plot(range(0,3),pca.explained_variance_ratio_.cumsum(), marker='o', linestyle='--')
plt.title('Explained variance by component')
plt.xlabel('Number Of Component')
plt.ylabel('Cumulative Explained Variance')
Out[20]:
Text(0, 0.5, 'Cumulative Explained Variance')
In [21]:
pca = PCA(2) 
components = pca.fit_transform(features)

Clustering

In [22]:
X = df_numeric.drop(['Class/ASD Traits _Yes'], axis = 1)
Y = df_numeric['Class/ASD Traits _Yes']
In [23]:
components
Out[23]:
array([[-0.12049278,  1.03077403],
       [-8.13762791,  0.17417383],
       [-8.12886118,  0.63766566],
       ...,
       [ 9.87760492, -1.11763654],
       [ 8.85856471,  0.77640764],
       [ 3.84403403, -0.42442151]])
In [24]:
wcss = []
for i in range(1,11):
   model = KMeans(n_clusters = i, init = "k-means++")
   model.fit(features)
   wcss.append(model.inertia_)
plt.figure(figsize=(10,10))
plt.plot(range(1,11), wcss)
plt.xlabel('Number of clusters')
plt.ylabel('WCSS')
plt.show()
In [25]:
model = KMeans(n_clusters = 4, init = "k-means++")
label = model.fit_predict(components)
print(label)
[3 0 0 ... 2 1 1]
In [26]:
plt.figure(figsize=(15,15))
uniq = np.unique(label)
for i in uniq:
  plt.scatter(components[label == i , 0] , components[label == i , 1] , label = i)
  
plt.legend()
plt.show()
In [27]:
model = KMeans(n_clusters = 6, init = "k-means++")
label = model.fit_predict(components)
print(label)
[4 0 0 ... 5 5 1]
In [28]:
plt.figure(figsize=(15,15))
uniq = np.unique(label)
for i in uniq:
  plt.scatter(components[label == i , 0] , components[label == i , 1] , label = i)
  
plt.legend()
plt.show()

Now the KMeans object holds the clustering model: we can pass data to it and obtain cluster predictions.
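A minimal sketch of that fit/predict cycle on toy 2-D points (not the PCA components above):

```python
import numpy as np
from sklearn.cluster import KMeans

# Two well-separated toy blobs.
X = np.array([[0.0, 0.0], [0.2, 0.1], [5.0, 5.0], [5.1, 4.9]])

model = KMeans(n_clusters=2, init='k-means++', n_init=10, random_state=0)
model.fit(X)

# predict() assigns each new point to its nearest learned centroid.
new_points = np.array([[0.1, 0.0], [5.0, 5.1]])
pred = model.predict(new_points)
print(pred)  # the two points land in different clusters
```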

In [29]:
model.cluster_centers_
Out[29]:
array([[-7.87896898,  0.07966602],
       [ 4.23821174, -0.11846985],
       [-3.34787105,  0.0585852 ],
       [14.51278803,  0.20859522],
       [ 0.4341576 , -0.31983196],
       [ 8.78427433, -0.15837695]])
In [30]:
# The predicted cluster labels:
label
Out[30]:
array([4, 0, 0, ..., 5, 5, 1], dtype=int32)
In [31]:
labels = np.zeros_like(label)
# Map each cluster to the majority true label among its members.
for i in np.unique(label):
    mask = (label == i)
    labels[mask] = mode(Y[mask])[0]

mat = confusion_matrix(Y, labels)
sns.heatmap(mat.T, square=True, annot=True, fmt='d', cbar=False)
plt.xlabel('true label')
plt.ylabel('predicted label');
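The majority-vote relabelling used above can be sketched end-to-end with toy arrays standing in for `label` and `Y` (hypothetical values, chosen for illustration):

```python
import numpy as np
from sklearn.metrics import accuracy_score

# Toy cluster assignments and true binary labels.
label = np.array([0, 0, 1, 1, 2, 2])
Y = np.array([0, 0, 1, 1, 1, 0])

# Map each cluster to the majority true label among its members.
mapped = np.zeros_like(label)
for i in np.unique(label):
    mask = (label == i)
    mapped[mask] = np.bincount(Y[mask]).argmax()

# Cluster 2 is split 1-1, so it takes label 0 (the first maximum).
print(accuracy_score(Y, mapped))
```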

Data analysis by visualizations

We will examine the class balance of the diagnosis results:

In [32]:
sns.countplot(x=classes)
Out[32]:
<matplotlib.axes._subplots.AxesSubplot at 0x7fd0fec589d0>
In [33]:
yes_autism= df[df['Class/ASD Traits ']=='Yes']
no_autism= df[df['Class/ASD Traits ']=='No']

We have 728 toddlers diagnosed with ASD traits and 326 without.

This result can be explained by the fact that the toddlers tested were already suspected of ASD, so the sample does not represent the general population.

In [34]:
corr = df.corr( )
plt.figure(figsize = (15,15))
sns.heatmap(data = corr, annot = True, square = True, cbar = True)
Out[34]:
<matplotlib.axes._subplots.AxesSubplot at 0x7fd0febad3d0>

Interpreting the correlation heatmap

The orange cells show high correlation: the answers A1-A9 (all except A10) are highly correlated with the Qchat-10 score. It is therefore better to remove the Qchat-10-Score column from our feature list, since it would leak the target and mislead our results.

Let's take a look at the data broken down by sex:

In [35]:
plt.pie(df["Sex"].value_counts(),labels=('male','female'),explode = [0.1,0],autopct ='%1.1f%%' ,
       startangle = 90,labeldistance = 1.1)
Out[35]:
([<matplotlib.patches.Wedge at 0x7fd0fe916610>,
  <matplotlib.patches.Wedge at 0x7fd0fe916cd0>],
 [Text(-0.976673118620744, -0.6972156189900154, 'male'),
  Text(0.8952836322308103, 0.6391144012300203, 'female')],
 [Text(-0.5697259858621007, -0.4067091110775089, '69.7%'),
  Text(0.48833652667135097, 0.34860785521637466, '30.3%')])
In [36]:
fig = px.bar(df, x="Sex", y='Qchat-10-Score')
fig.show()
In [37]:
ax = sns.countplot(x=classes, hue="Sex", data=df)
In [38]:
yes_autism['Sex'].value_counts()
Out[38]:
m    534
f    194
Name: Sex, dtype: int64
In [39]:
no_autism['Sex'].value_counts()
Out[39]:
m    201
f    125
Name: Sex, dtype: int64

The ratio of girls tested to boys tested is 0.43, while the ratio of girls diagnosed with ASD traits to boys diagnosed is 0.36.

It is interesting to see that the prevalence of autism appears much higher in boys than in girls. One hypothesis is that girls are more resistant to developing signs of autism, so a greater genetic load is needed to cause autism in girls than in boys; this may be because girls' social skills tend to be more developed.

Let’s look at the ratio of children diagnosed with jaundice to those who have not.

In [40]:
plt.pie(df["Jaundice"].value_counts(),labels=('no Jaundice','Jaundice'),explode = [0.1,0],autopct ='%1.1f%%' ,
       startangle = 90,labeldistance = 1.1)
Out[40]:
([<matplotlib.patches.Wedge at 0x7fd0fe822650>,
  <matplotlib.patches.Wedge at 0x7fd0fe822d50>],
 [Text(-0.9081759396795839, -0.7843573564307948, 'no Jaundice'),
  Text(0.8324946786900477, 0.7189941654511214, 'Jaundice')],
 [Text(-0.5297692981464238, -0.45754179125129696, '72.7%'),
  Text(0.45408800655820775, 0.3921786357006116, '27.3%')])

Is there a link between jaundice and gender?

In [41]:
ax = sns.countplot(x='Jaundice', hue="Sex", data=df)

There doesn't seem to be a connection between them.

But could jaundice be a factor associated with autism?

In [42]:
ax = sns.countplot(x=classes, hue="Jaundice", data=df)
In [43]:
yes_autism['Jaundice'].value_counts()
Out[43]:
no     513
yes    215
Name: Jaundice, dtype: int64
In [44]:
no_autism['Jaundice'].value_counts()
Out[44]:
no     253
yes     73
Name: Jaundice, dtype: int64

The ratio of toddlers born with jaundice among those with ASD traits (215 with jaundice to 513 without, a ratio of 0.42) is greater than the corresponding ratio among toddlers without ASD traits (73 with jaundice to 253 without, a ratio of 0.29).

This suggests a possible association between newborn jaundice and ASD traits in toddlers.

It is interesting to check whether autism is hereditary.

In [45]:
ax = sns.countplot(x=classes, hue="Family_mem_with_ASD", data=df)
In [46]:
yes_autism['Family_mem_with_ASD'].value_counts()
Out[46]:
no     613
yes    115
Name: Family_mem_with_ASD, dtype: int64
In [47]:
no_autism['Family_mem_with_ASD'].value_counts()
Out[47]:
no     271
yes     55
Name: Family_mem_with_ASD, dtype: int64

Interestingly, the proportions here are similar: about 16% (115 of 728) of the toddlers with ASD traits have an immediate family member with ASD, compared with about 17% (55 of 326) of those without. So although autism is known to have a genetic component, this sample alone does not show a clear association with family history.

Let's see how age influences the results.

In [48]:
df[['Age_Mons']].plot(kind='kde');

We can see that most of the participating toddlers were between 30 and 40 months old.

Now let's check the age range in which toddlers were diagnosed with ASD traits:

In [49]:
ax = sns.countplot(x=df['Age_Mons'], hue=classes, data=df)

We will plot the toddlers' ethnicity along with age and the autism diagnosis results:

In [50]:
fig = px.bar(df, x='Ethnicity', y='Age_Mons',
             hover_data=['Class/ASD Traits '], color='Class/ASD Traits ',
             height=400)
fig.show()

We will examine the distribution of different values using violin plots:

First, the distribution of the behavioural questions (A1-A9):

In [51]:
data = pd.concat([classes, features.iloc[:,0:9]], axis=1)
In [52]:
data = pd.melt(data, id_vars="Class/ASD Traits _Yes", var_name="features", value_name='value')
In [53]:
sns.violinplot(x="features", y="value", hue="Class/ASD Traits _Yes", data=data, split=True , inner="quart")
Out[53]:
<matplotlib.axes._subplots.AxesSubplot at 0x7fd0fe54a190>

It can be seen that no single question serves as a sufficient feature for the classification on its own; assessing ASD requires an overall view of all the behavioural questions.

We will therefore examine this visualization for Qchat-10-Score.

In [54]:
data = pd.concat([classes, df_numeric.iloc[:,11:12]], axis=1)
In [55]:
data = pd.melt(data, id_vars="Class/ASD Traits _Yes", var_name="df_numeric", value_name='value')
In [56]:
sns.violinplot(x="df_numeric", y="value", hue="Class/ASD Traits _Yes", data=data, split=True , inner="quart")
Out[56]:
<matplotlib.axes._subplots.AxesSubplot at 0x7fd0fe36c310>

In this violin plot it can be seen that the classification is very sharp: all toddlers who received a score of 4 or higher are labelled as having ASD traits.

conclusion

We found interesting findings on autism in toddlers according to the given data. Most toddlers are suspected of being 30-40 months old, probably this is the time when children start communicating normally. Gender genetics and jaundice at birth are also significant causes of the onset of autism in toddlers.

We remove the Qchat-10-Score feature because it was used to assign the class label; keeping it would leak the target into the inputs and the resulting models would be trivially overfitted.

In [57]:
df.drop('Qchat-10-Score', axis=1, inplace=True)
df_numeric = df_numeric.drop(['Qchat-10-Score'], axis=1)
In [58]:
X = df_numeric.drop(['Class/ASD Traits _Yes'], axis = 1)
Y = df_numeric['Class/ASD Traits _Yes']

Classification

Having explored the data, we now look for the model that predicts ASD in toddlers most accurately. We will train several algorithms on our data and compare the resulting models, so that when we run the best one on any toddler suspected of autism we can predict as accurately as possible whether they have ASD.

Division into training data and test data:

In [59]:
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size = 0.40, random_state = 42)

Define functions for the models

In [60]:
# accuracy_score and confusion_matrix report for one fitted model
def analysis_of_results(model, name, X_test, Y_test, Y_hat_test, Y_hat_train):
    # Note: Y_train is taken from the enclosing notebook scope.
    plot_confusion_matrix(model, X_test, Y_test)
    print('\n')
    print(f'test accuracy: {round(accuracy_score(Y_test, Y_hat_test), 2)}')
    print(f'train accuracy: {round(accuracy_score(Y_train, Y_hat_train), 2)}')
    print('\n')
    print('\tConfusion Matrix')
    plt.show()
    print('\n')
    print('Confusion Matrix:\n', pd.crosstab(Y_test, Y_hat_test, rownames=['Actual'], colnames=['Predicted'], margins=True))
    print('\ntest report:\n' + classification_report(Y_test, Y_hat_test))
    print('~' * 60)
    print('\ntrain report:\n' + classification_report(Y_train, Y_hat_train))
    print('-' * 60)

Models:

Logistic regression

Regression finds a formula (a mathematical relationship) that, given a vector of independent variables x, computes a value for a dependent variable y.

Logistic regression is the binary-classification variant: it maps that value to a probability and divides the outcomes into two classes.
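Concretely, logistic regression passes a linear score w·x + b through the sigmoid σ(z) = 1/(1 + e^(-z)) and thresholds the resulting probability at 0.5. A minimal sketch:

```python
import numpy as np

def sigmoid(z):
    # Maps any real-valued score to a probability in (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

# Three example linear scores w.x + b:
z = np.array([-2.0, 0.0, 2.0])
p = sigmoid(z)
print(p.round(3))               # -> [0.119 0.5   0.881]
print((p >= 0.5).astype(int))   # predicted classes: [0 1 1]
```

Incidentally, the ConvergenceWarning printed below can usually be silenced by raising `max_iter` (e.g. `LogisticRegression(max_iter=1000)`) or by scaling the features first.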

In [61]:
model = LogisticRegression() 
model.fit(X_train,Y_train)
Y_hat_test = model.predict(X_test).astype(int)
Y_hat_train = model.predict(X_train).astype(int)
analysis_of_results(model,'Logistic Regression' ,X_test,Y_test,Y_hat_test, Y_hat_train)

test accuracy: 1.0
train accuracy: 1.0


	Confusion Matrix
/usr/local/lib/python3.7/dist-packages/sklearn/linear_model/_logistic.py:818: ConvergenceWarning:

lbfgs failed to converge (status=1):
STOP: TOTAL NO. of ITERATIONS REACHED LIMIT.

Increase the number of iterations (max_iter) or scale the data as shown in:
    https://scikit-learn.org/stable/modules/preprocessing.html
Please also refer to the documentation for alternative solver options:
    https://scikit-learn.org/stable/modules/linear_model.html#logistic-regression

/usr/local/lib/python3.7/dist-packages/sklearn/utils/deprecation.py:87: FutureWarning:

Function plot_confusion_matrix is deprecated; Function `plot_confusion_matrix` is deprecated in 1.0 and will be removed in 1.2. Use one of the class methods: ConfusionMatrixDisplay.from_predictions or ConfusionMatrixDisplay.from_estimator.


Confusion Matrix:
 Predicted    0    1  All
Actual                  
0          128    1  129
1            1  292  293
All        129  293  422

test report:
              precision    recall  f1-score   support

           0       0.99      0.99      0.99       129
           1       1.00      1.00      1.00       293

    accuracy                           1.00       422
   macro avg       0.99      0.99      0.99       422
weighted avg       1.00      1.00      1.00       422

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

train report:
              precision    recall  f1-score   support

           0       1.00      1.00      1.00       197
           1       1.00      1.00      1.00       435

    accuracy                           1.00       632
   macro avg       1.00      1.00      1.00       632
weighted avg       1.00      1.00      1.00       632

------------------------------------------------------------

K-nearest neighbours

The algorithm compares a new sample to the labeled data by Euclidean distance and assigns it the most common class among its K nearest neighbours.
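The distance-and-vote procedure is short enough to sketch from scratch (toy data, not the notebook's):

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    # Euclidean distance from x to every training point.
    d = np.linalg.norm(X_train - x, axis=1)
    # Labels of the k nearest neighbours; majority vote decides the class.
    nearest = y_train[np.argsort(d)[:k]]
    return Counter(nearest).most_common(1)[0][0]

# Two well-separated toy clusters.
X_train = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]])
y_train = np.array([0, 0, 0, 1, 1, 1])
print(knn_predict(X_train, y_train, np.array([0.5, 0.5])))  # -> 0
print(knn_predict(X_train, y_train, np.array([5.5, 5.5])))  # -> 1
```

`KNeighborsClassifier` does the same thing with an efficient neighbour search (and defaults to k=5).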

In [62]:
model = KNeighborsClassifier()
model.fit(X_train, Y_train)
Y_hat_test = model.predict(X_test).astype(int)
Y_hat_train = model.predict(X_train).astype(int)
analysis_of_results(model,'K-nearest neighbours' ,X_test,Y_test,Y_hat_test, Y_hat_train)

test accuracy: 0.94
train accuracy: 0.95


	Confusion Matrix

Confusion Matrix:
 Predicted    0    1  All
Actual                  
0          120    9  129
1           18  275  293
All        138  284  422

test report:
              precision    recall  f1-score   support

           0       0.87      0.93      0.90       129
           1       0.97      0.94      0.95       293

    accuracy                           0.94       422
   macro avg       0.92      0.93      0.93       422
weighted avg       0.94      0.94      0.94       422

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

train report:
              precision    recall  f1-score   support

           0       0.90      0.96      0.93       197
           1       0.98      0.95      0.96       435

    accuracy                           0.95       632
   macro avg       0.94      0.95      0.95       632
weighted avg       0.95      0.95      0.95       632

------------------------------------------------------------
In [63]:
shap.initjs()

explainer = shap.KernelExplainer(model.predict_proba, X_train)
shap_values = explainer.shap_values(X_test.iloc[0,:])
shap.force_plot(explainer.expected_value[0], shap_values[0], X_test.iloc[0,:])
/usr/local/lib/python3.7/dist-packages/sklearn/base.py:451: UserWarning:

X does not have valid feature names, but KNeighborsClassifier was fitted with feature names

Using 632 background data samples could cause slower run times. Consider using shap.sample(data, K) or shap.kmeans(data, K) to summarize the background as K samples.
/usr/local/lib/python3.7/dist-packages/sklearn/linear_model/_base.py:138: FutureWarning:

The default of 'normalize' will be set to False in version 1.2 and deprecated in version 1.4.
If you wish to scale the data, use Pipeline with a StandardScaler in a preprocessing stage. To reproduce the previous behavior:

from sklearn.pipeline import make_pipeline

model = make_pipeline(StandardScaler(with_mean=False), LassoLarsIC())

If you wish to pass a sample_weight parameter, you need to pass it as a fit parameter to each step of the pipeline as follows:

kwargs = {s[0] + '__sample_weight': sample_weight for s in model.steps}
model.fit(X, y, **kwargs)

Set parameter alpha to: original_alpha * np.sqrt(n_samples). 


Out[63]:
Visualization omitted, Javascript library not loaded!
Have you run `initjs()` in this notebook? If this notebook was from another user you must also trust this notebook (File -> Trust notebook). If you are viewing this notebook on github the Javascript has been stripped for security. If you are using JupyterLab this error is because a JupyterLab extension has not yet been written.

Each feature pushes the model output away from the base value (the average model output over the training dataset we passed), here toward zero. The features displayed in red push hardest.

We predicted 0.00, whereas the base value is 0.3304. Feature values that increase the prediction are shown in pink, with the visual size indicating the magnitude of each feature's effect; feature values that decrease the prediction are in blue. The biggest impact comes from A4 being 0, while A9 = 1 also meaningfully decreases the prediction.

GaussianNB

A probabilistic classification method based on the "naive" assumption that the features are conditionally independent of one another given the class. The advantage of this method is that it scales easily to large amounts of data.
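As a sketch of the idea (on synthetic Gaussian blobs, not the notebook's data): each feature's class-conditional distribution is modelled as an independent Gaussian, and Bayes' rule combines them into a class probability.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Two well-separated 2-D Gaussian blobs; each feature is modelled
# independently per class (the "naive" conditional-independence assumption).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

clf = GaussianNB().fit(X, y)
print(clf.predict([[0, 0], [5, 5]]))  # -> [0 1]
print(clf.theta_)  # per-class feature means, estimated independently
```

The weak performance below suggests the independence assumption fits this dataset poorly: the ten behavioral answers are strongly correlated with each other.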

In [64]:
model = GaussianNB()
model.fit(X_train, Y_train)
Y_hat_test = model.predict(X_test).astype(int)
Y_hat_train = model.predict(X_train).astype(int)
analysis_of_results(model, 'GaussianNB', X_test, Y_test, Y_hat_test, Y_hat_train)

test accuracy: 0.57
train accuracy: 0.56


	Confusion Matrix

Confusion Matrix:
 Predicted    0    1  All
Actual                  
0          123    6  129
1          174  119  293
All        297  125  422

test report:
              precision    recall  f1-score   support

           0       0.41      0.95      0.58       129
           1       0.95      0.41      0.57       293

    accuracy                           0.57       422
   macro avg       0.68      0.68      0.57       422
weighted avg       0.79      0.57      0.57       422

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

train report:
              precision    recall  f1-score   support

           0       0.42      0.99      0.59       197
           1       0.99      0.37      0.54       435

    accuracy                           0.56       632
   macro avg       0.70      0.68      0.56       632
weighted avg       0.81      0.56      0.55       632

------------------------------------------------------------

XGB

XGBoost is a gradient-boosted decision-tree model that often outperforms deep learning on tabular data, as long as the dataset is not extremely large or complicated.
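The boosting idea: trees are added one at a time, each fitted to the remaining errors of the ensemble built so far. A sketch using scikit-learn's `GradientBoostingClassifier` as a stand-in for `XGBClassifier` (same family, synthetic data):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data.
X, y = make_classification(n_samples=400, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = GradientBoostingClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# staged_predict yields predictions after 1 tree, 2 trees, ..., 100 trees.
stage_acc = [accuracy_score(y_te, pred) for pred in clf.staged_predict(X_te)]
print(stage_acc[0], stage_acc[-1])  # accuracy after 1 tree vs. after 100 trees
```

XGBoost adds regularization and engineering optimizations on top of this scheme, but the staged-correction principle is the same.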

In [65]:
model = XGBClassifier()
model.fit(X_train, Y_train)
Y_hat_test = model.predict(X_test).astype(int)
Y_hat_train = model.predict(X_train).astype(int)
analysis_of_results(model,'XGB' ,X_test,Y_test,Y_hat_test, Y_hat_train)

test accuracy: 0.99
train accuracy: 1.0


	Confusion Matrix

Confusion Matrix:
 Predicted    0    1  All
Actual                  
0          126    3  129
1            3  290  293
All        129  293  422

test report:
              precision    recall  f1-score   support

           0       0.98      0.98      0.98       129
           1       0.99      0.99      0.99       293

    accuracy                           0.99       422
   macro avg       0.98      0.98      0.98       422
weighted avg       0.99      0.99      0.99       422

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

train report:
              precision    recall  f1-score   support

           0       1.00      0.99      1.00       197
           1       1.00      1.00      1.00       435

    accuracy                           1.00       632
   macro avg       1.00      1.00      1.00       632
weighted avg       1.00      1.00      1.00       632

------------------------------------------------------------
In [66]:
shap.initjs()

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

shap.force_plot(explainer.expected_value, shap_values[0,:], X.iloc[0,:])
Out[66]:
(Interactive SHAP force plot; not rendered in this static export.)

We predicted -0.77 (in log-odds), whereas the base value is 1.101. The biggest impact comes from A8 = 1, while the A6 value meaningfully decreases the prediction.

In [67]:
shap.summary_plot(shap_values, X)

The features are ranked by importance in descending order: A9 has the strongest positive effect on the autism prediction, while Ethnicity_middle_eastern is negatively correlated with the target variable.

In [68]:
def plot_feature_importances(model):
    n_features = X_train.shape[1]
    plt.figure(figsize=(8, 8))
    plt.barh(range(n_features), model.feature_importances_, align='center')
    plt.yticks(np.arange(n_features), X_train.columns.values)
    plt.xlabel('Feature importance')
    plt.ylabel('Feature')
In [69]:
plot_feature_importances(model)

Support vector machine

A method for classification and regression in which the training examples are represented as vectors in a linear space. Training fits a separator that distinguishes the positive from the negative examples as well as possible; the chosen classifier is the separator that leaves the largest possible margin between itself and the closest examples of the two classes.
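A minimal sketch of the margin idea on linearly separable toy data (not the notebook's): only the points closest to the boundary, the support vectors, determine the separator.

```python
import numpy as np
from sklearn.svm import SVC

# Two well-separated toy clusters.
X = np.array([[0, 0], [1, 1], [1, 0], [4, 4], [5, 5], [4, 5]], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1])

# A linear SVM picks the separator with the widest margin.
clf = SVC(kernel='linear', C=1.0).fit(X, y)
print(clf.support_vectors_)                    # only boundary-adjacent points
print(clf.predict([[0.5, 0.5], [4.5, 4.5]]))   # -> [0 1]
```

Note that the rbf-kernel SVC below is trained on unscaled features (Age_Mons up to 36 next to 0/1 dummies); wrapping it in a `StandardScaler` pipeline would likely improve its accuracy considerably, since RBF distances are dominated by the largest-scale feature.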

In [70]:
model = SVC(kernel = 'rbf', probability = True)
model.fit(X_train, Y_train)
Y_hat_test = model.predict(X_test).astype(int)
Y_hat_train = model.predict(X_train).astype(int)
analysis_of_results(model,'SVC' ,X_test,Y_test,Y_hat_test, Y_hat_train)

test accuracy: 0.75
train accuracy: 0.74


	Confusion Matrix

Confusion Matrix:
 Predicted   0    1  All
Actual                 
0          25  104  129
1           0  293  293
All        25  397  422

test report:
              precision    recall  f1-score   support

           0       1.00      0.19      0.32       129
           1       0.74      1.00      0.85       293

    accuracy                           0.75       422
   macro avg       0.87      0.60      0.59       422
weighted avg       0.82      0.75      0.69       422

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

train report:
              precision    recall  f1-score   support

           0       1.00      0.17      0.29       197
           1       0.73      1.00      0.84       435

    accuracy                           0.74       632
   macro avg       0.86      0.59      0.57       632
weighted avg       0.81      0.74      0.67       632

------------------------------------------------------------
In [72]:
%%capture

explainer = shap.KernelExplainer(model.predict_proba, X_train, link="logit")
shap_values = explainer.shap_values(X_test, nsamples=100)
In [73]:
shap.initjs()
shap.force_plot(explainer.expected_value[0], shap_values[0][0,:], X_test.iloc[0,:], link="logit")
Out[73]:
(Interactive SHAP force plot; not rendered in this static export.)

We predicted 0.00, whereas the base value is 0.3625. The biggest impact comes from "Who completed the test_self" = 0, while Age_Mons = 33 meaningfully decreases the prediction.

In [74]:
shap.initjs()

shap.force_plot(explainer.expected_value[0], shap_values[0], X_test, link="logit")
Out[74]:
(Interactive SHAP force plot; not rendered in this static export.)

Decision Tree

A decision tree is a prediction model that maps observations to their target values, so that an unlabeled observation can be assigned a predicted value. It is a binary tree whose internal (decision) nodes each test a condition on one characteristic of the observations, and whose leaves hold the predicted value for the observations matching the path that leads to them.
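The node-test/leaf-prediction structure can be printed directly; a minimal sketch on the standard iris dataset (not the notebook's data):

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

iris = load_iris()
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(iris.data, iris.target)

# Each internal node tests one feature against a threshold; each leaf carries
# the predicted class for observations whose path ends there.
rules = export_text(tree, feature_names=iris.feature_names)
print(rules)
```

`export_text` on the notebook's fitted model would show which behavioral questions the tree splits on first.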

In [75]:
model = DecisionTreeClassifier()
model.fit(X_train, Y_train)
Y_hat_test = model.predict(X_test).astype(int)
Y_hat_train = model.predict(X_train).astype(int)
analysis_of_results(model,'Decision Tree' ,X_test,Y_test,Y_hat_test, Y_hat_train)

test accuracy: 0.91
train accuracy: 1.0


	Confusion Matrix

Confusion Matrix:
 Predicted    0    1  All
Actual                  
0          109   20  129
1           16  277  293
All        125  297  422

test report:
              precision    recall  f1-score   support

           0       0.87      0.84      0.86       129
           1       0.93      0.95      0.94       293

    accuracy                           0.91       422
   macro avg       0.90      0.90      0.90       422
weighted avg       0.91      0.91      0.91       422

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

train report:
              precision    recall  f1-score   support

           0       1.00      1.00      1.00       197
           1       1.00      1.00      1.00       435

    accuracy                           1.00       632
   macro avg       1.00      1.00      1.00       632
weighted avg       1.00      1.00      1.00       632

------------------------------------------------------------
In [76]:
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_train)
shap.summary_plot(shap_values, X_train, plot_type= "bar")

The features are ranked by importance in descending order: A5 is the most effective, followed by A2, and so on.

Random Forest

Compared to a single decision tree, this algorithm is less prone to overfitting: it trains an ensemble of decision trees, each on a different bootstrap sample of the data (with random feature subsets), and combines their votes into one final prediction.
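The variance-reduction effect can be seen by comparing a single unpruned tree against the averaged ensemble on held-out synthetic data:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic data standing in for the notebook's features.
X, y = make_classification(n_samples=500, n_informative=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# A single unpruned tree typically generalizes worse than the averaged forest.
print(tree.score(X_te, y_te), forest.score(X_te, y_te))
```

This mirrors the results below: both reach train accuracy 1.0, but the forest's test accuracy holds up better than the single tree's.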

In [77]:
model = RandomForestClassifier()
model.fit(X_train, Y_train)
Y_hat_test = model.predict(X_test).astype(int)
Y_hat_train = model.predict(X_train).astype(int)
analysis_of_results(model, 'Random Forest', X_test, Y_test, Y_hat_test, Y_hat_train)

test accuracy: 0.97
train accuracy: 1.0


	Confusion Matrix

Confusion Matrix:
 Predicted    0    1  All
Actual                  
0          119   10  129
1            2  291  293
All        121  301  422

test report:
              precision    recall  f1-score   support

           0       0.98      0.92      0.95       129
           1       0.97      0.99      0.98       293

    accuracy                           0.97       422
   macro avg       0.98      0.96      0.97       422
weighted avg       0.97      0.97      0.97       422

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

train report:
              precision    recall  f1-score   support

           0       1.00      1.00      1.00       197
           1       1.00      1.00      1.00       435

    accuracy                           1.00       632
   macro avg       1.00      1.00      1.00       632
weighted avg       1.00      1.00      1.00       632

------------------------------------------------------------
In [78]:
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_train)
shap.summary_plot(shap_values, X_train, plot_type= "bar")

The features are ranked by importance in descending order: A6 is the most effective, followed by A5, and so on.

Linear Discriminant Analysis

A statistical technique that builds a function for classifying observations, taking into account a set of discriminating variables and the probability of class membership.

In [79]:
model = LinearDiscriminantAnalysis()
model.fit(X_train, Y_train)
Y_hat_test = model.predict(X_test).astype(int)
Y_hat_train = model.predict(X_train).astype(int)
analysis_of_results(model,'Linear Discriminant Analysis' ,X_test,Y_test,Y_hat_test, Y_hat_train)

test accuracy: 0.96
train accuracy: 0.97


	Confusion Matrix

Confusion Matrix:
 Predicted    0    1  All
Actual                  
0          126    3  129
1           15  278  293
All        141  281  422

test report:
              precision    recall  f1-score   support

           0       0.89      0.98      0.93       129
           1       0.99      0.95      0.97       293

    accuracy                           0.96       422
   macro avg       0.94      0.96      0.95       422
weighted avg       0.96      0.96      0.96       422

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

train report:
              precision    recall  f1-score   support

           0       0.92      0.99      0.95       197
           1       1.00      0.96      0.98       435

    accuracy                           0.97       632
   macro avg       0.96      0.97      0.96       632
weighted avg       0.97      0.97      0.97       632

------------------------------------------------------------

Accuracy Summary

In [80]:
models = ["Logistic regression", "K-nearest neighbours", "GaussianNB", "XGB",
          "Support vector machine", "Decision Tree", "Random Forest", "Linear Discriminant Analysis"]
test_accuracy = [1, 0.94, 0.57, 0.99, 0.75, 0.91, 0.97, 0.96]
train_accuracy = [1, 0.95, 0.56, 1, 0.74, 1, 1, 0.97]
accuracy_summary = pd.DataFrame([models,  test_accuracy, train_accuracy]).T 
accuracy_summary.columns = ["Classifier", "test_accuracy", "train_accuracy"] 
accuracy_summary
Out[80]:
Classifier test_accuracy train_accuracy
0 Logistic regression 1 1
1 K-nearest neighbours 0.94 0.95
2 GaussianNB 0.57 0.56
3 XGB 0.99 1
4 Support vector machine 0.75 0.74
5 Decision Tree 0.91 1
6 Random Forest 0.97 1
7 Linear Discriminant Analysis 0.96 0.97

Logistic regression's train and test accuracy both round to 100%, with recall, precision, and F1-score near 100% for both classes and only two misclassifications on the test set (one false positive and one false negative). Logistic regression with default parameters is the best model.

The KNN model has test accuracy 0.94, train accuracy 0.95, and a weighted-average recall of 94%, so KNN is a good model.

GaussianNB has low accuracy (0.57), so it is rejected.

XGBoost has test accuracy 0.99 and train accuracy 1.0, with a 99% weighted-average recall, so it is an excellent model, ranked second after logistic regression.

The support vector machine has low accuracy (0.75), so it is rejected.

The decision tree model has test accuracy 0.91 but train accuracy 1.0; it would need hyperparameter tuning to reduce the overfitting, so it is not accepted as-is.

The random forest model has test accuracy 0.97 and train accuracy 1.0, so it is also a good model.

Linear discriminant analysis has test accuracy 0.96, train accuracy 0.97, and 96% recall, so it is also a good model.

Next, we will tune the support vector machine's hyperparameters with grid search (without a pipeline), then tune the random forest with grid search inside a pipeline, and additionally model the data with neural networks via scikit-learn and the Keras library.

Tuning hyperparameters for SVC

In [81]:
model = SVC()

params = {
    'C': [0.1, 0.8, 0.9, 1, 1.1, 1.2, 1.3, 1.4],
    'kernel': ['linear', 'rbf'],
    'gamma': [0.1, 0.8, 0.9, 1, 1.1, 1.2, 1.3, 1.4]
}

clf = GridSearchCV(model, param_grid = params, scoring = 'accuracy', cv = 10, verbose = 2)

clf.fit(X_train, Y_train)
Fitting 10 folds for each of 128 candidates, totalling 1280 fits
[CV] END ....................C=0.1, gamma=0.1, kernel=linear; total time=   0.0s
... (verbose per-fit log truncated; GridSearchCV runs 1280 fits in total)
[CV] END .......................C=0.1, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=0.1, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.1, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.8, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.8, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.8, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.8, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.8, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.8, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.8, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.8, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.8, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.8, kernel=rbf; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END ......................C=0.8, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=0.8, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=0.8, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=0.8, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=0.8, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=0.8, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=0.8, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=0.8, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=0.8, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=0.8, gamma=1, kernel=linear; total time=   0.0s
[CV] END .........................C=0.8, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=0.8, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=0.8, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=0.8, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=0.8, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=0.8, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=0.8, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=0.8, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=0.8, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=0.8, gamma=1, kernel=rbf; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=0.8, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.8, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.8, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.8, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.8, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.8, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.8, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.8, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.8, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.8, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.8, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.8, kernel=rbf; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END ......................C=0.9, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=0.9, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=0.9, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=0.9, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=0.9, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=0.9, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=0.9, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=0.9, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=0.9, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=0.9, gamma=1, kernel=linear; total time=   0.0s
[CV] END .........................C=0.9, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=0.9, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=0.9, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=0.9, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=0.9, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=0.9, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=0.9, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=0.9, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=0.9, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=0.9, gamma=1, kernel=rbf; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END .......................C=0.9, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END .......................C=0.9, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=0.9, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=0.9, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ... (per-fold log truncated: 10 cross-validation folds for each combination of C ∈ {0.9, 1, 1.1, 1.2, ...}, gamma ∈ {0.1, 0.8, 0.9, 1, 1.1, 1.2, 1.3, 1.4}, kernel ∈ {linear, rbf}; every fold completed in 0.0–0.2s)
[CV] END ....................C=1.2, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END .......................C=1.2, gamma=1.1, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.1, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.1, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.1, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.1, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.1, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.1, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.1, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.1, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.1, kernel=rbf; total time=   0.1s
[CV] END ....................C=1.2, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END .......................C=1.2, gamma=1.2, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.2, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.2, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.2, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.2, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.2, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.2, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.2, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.2, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.2, kernel=rbf; total time=   0.1s
[CV] END ....................C=1.2, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END .......................C=1.2, gamma=1.3, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.3, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.3, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.3, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.3, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.3, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.3, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.3, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.3, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.3, kernel=rbf; total time=   0.1s
[CV] END ....................C=1.2, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=1.2, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END .......................C=1.2, gamma=1.4, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.4, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.4, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.4, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.4, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.4, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.4, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.4, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.4, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.2, gamma=1.4, kernel=rbf; total time=   0.1s
[CV] END ....................C=1.3, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END .......................C=1.3, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END .......................C=1.3, gamma=0.8, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.3, gamma=0.8, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.3, gamma=0.8, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.3, gamma=0.8, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.3, gamma=0.8, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.3, gamma=0.8, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.3, gamma=0.8, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.3, gamma=0.8, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.3, gamma=0.8, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.3, gamma=0.8, kernel=rbf; total time=   0.1s
[CV] END ....................C=1.3, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END .......................C=1.3, gamma=0.9, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.3, gamma=0.9, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.3, gamma=0.9, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.3, gamma=0.9, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.3, gamma=0.9, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.3, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=0.9, kernel=rbf; total time=   0.0s
[CV] END ......................C=1.3, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=1.3, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=1.3, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=1.3, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=1.3, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=1.3, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=1.3, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=1.3, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=1.3, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=1.3, gamma=1, kernel=linear; total time=   0.0s
[CV] END .........................C=1.3, gamma=1, kernel=rbf; total time=   0.1s
[CV] END .........................C=1.3, gamma=1, kernel=rbf; total time=   0.1s
[CV] END .........................C=1.3, gamma=1, kernel=rbf; total time=   0.1s
[CV] END .........................C=1.3, gamma=1, kernel=rbf; total time=   0.1s
[CV] END .........................C=1.3, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=1.3, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=1.3, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=1.3, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=1.3, gamma=1, kernel=rbf; total time=   0.0s
[CV] END .........................C=1.3, gamma=1, kernel=rbf; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.1, kernel=rbf; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.2, kernel=rbf; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.3, kernel=linear; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.3, kernel=rbf; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END ....................C=1.3, gamma=1.4, kernel=linear; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.3, gamma=1.4, kernel=rbf; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.1, kernel=linear; total time=   0.0s
[CV] END .......................C=1.4, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.4, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.4, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.4, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.4, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.4, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.4, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.4, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.4, gamma=0.1, kernel=rbf; total time=   0.0s
[CV] END .......................C=1.4, gamma=0.1, kernel=rbf; total time=   0.1s
[CV] END ....................C=1.4, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.8, kernel=linear; total time=   0.0s
[CV] END .......................C=1.4, gamma=0.8, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=0.8, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=0.8, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=0.8, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=0.8, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=0.8, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=0.8, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=0.8, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=0.8, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=0.8, kernel=rbf; total time=   0.1s
[CV] END ....................C=1.4, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=0.9, kernel=linear; total time=   0.0s
[CV] END .......................C=1.4, gamma=0.9, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=0.9, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=0.9, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=0.9, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=0.9, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=0.9, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=0.9, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=0.9, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=0.9, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=0.9, kernel=rbf; total time=   0.1s
[CV] END ......................C=1.4, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=1.4, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=1.4, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=1.4, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=1.4, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=1.4, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=1.4, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=1.4, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=1.4, gamma=1, kernel=linear; total time=   0.0s
[CV] END ......................C=1.4, gamma=1, kernel=linear; total time=   0.0s
[CV] END .........................C=1.4, gamma=1, kernel=rbf; total time=   0.1s
[CV] END .........................C=1.4, gamma=1, kernel=rbf; total time=   0.1s
[CV] END .........................C=1.4, gamma=1, kernel=rbf; total time=   0.1s
[CV] END .........................C=1.4, gamma=1, kernel=rbf; total time=   0.1s
[CV] END .........................C=1.4, gamma=1, kernel=rbf; total time=   0.1s
[CV] END .........................C=1.4, gamma=1, kernel=rbf; total time=   0.1s
[CV] END .........................C=1.4, gamma=1, kernel=rbf; total time=   0.1s
[CV] END .........................C=1.4, gamma=1, kernel=rbf; total time=   0.1s
[CV] END .........................C=1.4, gamma=1, kernel=rbf; total time=   0.1s
[CV] END .........................C=1.4, gamma=1, kernel=rbf; total time=   0.1s
[CV] END ....................C=1.4, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=1.1, kernel=linear; total time=   0.0s
[CV] END .......................C=1.4, gamma=1.1, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=1.1, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=1.1, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=1.1, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=1.1, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=1.1, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=1.1, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=1.1, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=1.1, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=1.1, kernel=rbf; total time=   0.1s
[CV] END ....................C=1.4, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END ....................C=1.4, gamma=1.2, kernel=linear; total time=   0.0s
[CV] END .......................C=1.4, gamma=1.2, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=1.2, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=1.2, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=1.2, kernel=rbf; total time=   0.1s
[CV] END .......................C=1.4, gamma=1.2, kernel=rbf; total time=   0.1s
... (remaining verbose output truncated: 10-fold cross-validation for each remaining combination of C=1.4, gamma in {1.2, 1.3, 1.4}, kernel in {linear, rbf}; every fold finished in 0.1s or less)
Out[81]:
GridSearchCV(cv=10, estimator=SVC(),
             param_grid={'C': [0.1, 0.8, 0.9, 1, 1.1, 1.2, 1.3, 1.4],
                         'gamma': [0.1, 0.8, 0.9, 1, 1.1, 1.2, 1.3, 1.4],
                         'kernel': ['linear', 'rbf']},
             scoring='accuracy', verbose=2)
In [82]:
clf.best_params_
Out[82]:
{'C': 0.8, 'gamma': 0.1, 'kernel': 'linear'}
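For reference, a minimal, self-contained sketch of how a grid search like the one above can be set up. Synthetic data stands in for the notebook's `X_train`/`Y_train`, and the grid is trimmed for speed:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic stand-in for the notebook's training data
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

param_grid = {
    'C': [0.1, 0.8, 1.0],        # trimmed version of the notebook's grid
    'gamma': [0.1, 1.0],
    'kernel': ['linear', 'rbf'],
}
clf = GridSearchCV(SVC(), param_grid, scoring='accuracy', cv=3)
clf.fit(X, y)
print(clf.best_params_)  # dict with the best 'C', 'gamma', 'kernel'
```

`best_params_` holds the winning combination, exactly as read out in the next cell.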

SVC after hyperparameter tuning

In [83]:
model = SVC(C = 0.8, gamma = 0.1, kernel = 'linear', probability = True)
model.fit(X_train, Y_train)
Y_hat_test = model.predict(X_test).astype(int)
Y_hat_train = model.predict(X_train).astype(int)
analysis_of_results(model,'SVC' ,X_test,Y_test,Y_hat_test, Y_hat_train)
/usr/local/lib/python3.7/dist-packages/sklearn/utils/deprecation.py:87: FutureWarning:

Function plot_confusion_matrix is deprecated; Function `plot_confusion_matrix` is deprecated in 1.0 and will be removed in 1.2. Use one of the class methods: ConfusionMatrixDisplay.from_predictions or ConfusionMatrixDisplay.from_estimator.


test accuracy: 1.0
train accuracy: 1.0


	Confusion Matrix

Confusion Matrix:
 Predicted    0    1  All
Actual                  
0          129    0  129
1            0  293  293
All        129  293  422

test report:
              precision    recall  f1-score   support

           0       1.00      1.00      1.00       129
           1       1.00      1.00      1.00       293

    accuracy                           1.00       422
   macro avg       1.00      1.00      1.00       422
weighted avg       1.00      1.00      1.00       422

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

train report:
              precision    recall  f1-score   support

           0       1.00      1.00      1.00       197
           1       1.00      1.00      1.00       435

    accuracy                           1.00       632
   macro avg       1.00      1.00      1.00       632
weighted avg       1.00      1.00      1.00       632

------------------------------------------------------------
In [84]:
%%capture
explainer = shap.KernelExplainer(model.predict_proba, X_train, link="logit")
shap_values = explainer.shap_values(X_test, nsamples=100)
In [85]:
shap.initjs()
shap.force_plot(explainer.expected_value[0], shap_values[0][0,:], X_test.iloc[0,:], link="logit")
Out[85]:
Visualization omitted, Javascript library not loaded!
Have you run `initjs()` in this notebook? If this notebook was from another user you must also trust this notebook (File -> Trust notebook). If you are viewing this notebook on github the Javascript has been stripped for security. If you are using JupyterLab this error is because a JupyterLab extension has not yet been written.

We predicted 0.00, whereas the base value is 0.3124. The A7 = 1 value has a meaningful effect in decreasing the prediction, followed by A2 = 1, and so on.

In [86]:
shap.initjs()

shap.force_plot(explainer.expected_value[0], shap_values[0], X_test, link="logit")
Out[86]:
Visualization omitted, Javascript library not loaded!
Have you run `initjs()` in this notebook? If this notebook was from another user you must also trust this notebook (File -> Trust notebook). If you are viewing this notebook on github the Javascript has been stripped for security. If you are using JupyterLab this error is because a JupyterLab extension has not yet been written.

Neural Network Classifier Using SKLearn

In [87]:
model = MLPClassifier(activation='tanh', solver='lbfgs', learning_rate='constant',
                      early_stopping=False, alpha=0.0001,
                      hidden_layer_sizes=(100, 3), random_state=33)
model.fit(X_train, Y_train)

Y_hat_test = model.predict(X_test)
Y_hat_train = model.predict(X_train)

analysis_of_results(model,'MLPClassifier' ,X_test,Y_test,Y_hat_test, Y_hat_train)

test accuracy: 0.98
train accuracy: 1.0


	Confusion Matrix
/usr/local/lib/python3.7/dist-packages/sklearn/neural_network/_multilayer_perceptron.py:549: ConvergenceWarning:

lbfgs failed to converge (status=1):
STOP: TOTAL NO. of ITERATIONS REACHED LIMIT.

Increase the number of iterations (max_iter) or scale the data as shown in:
    https://scikit-learn.org/stable/modules/preprocessing.html

/usr/local/lib/python3.7/dist-packages/sklearn/utils/deprecation.py:87: FutureWarning:

Function plot_confusion_matrix is deprecated; Function `plot_confusion_matrix` is deprecated in 1.0 and will be removed in 1.2. Use one of the class methods: ConfusionMatrixDisplay.from_predictions or ConfusionMatrixDisplay.from_estimator.


Confusion Matrix:
 Predicted    0    1  All
Actual                  
0          125    4  129
1            3  290  293
All        128  294  422

test report:
              precision    recall  f1-score   support

           0       0.98      0.97      0.97       129
           1       0.99      0.99      0.99       293

    accuracy                           0.98       422
   macro avg       0.98      0.98      0.98       422
weighted avg       0.98      0.98      0.98       422

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

train report:
              precision    recall  f1-score   support

           0       1.00      1.00      1.00       197
           1       1.00      1.00      1.00       435

    accuracy                           1.00       632
   macro avg       1.00      1.00      1.00       632
weighted avg       1.00      1.00      1.00       632

------------------------------------------------------------
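The `ConvergenceWarning` above suggests raising `max_iter` or scaling the inputs. A hedged sketch of both fixes on synthetic data (the effect on the notebook's own dataset is untested):

```python
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the notebook's training data
X, y = make_classification(n_samples=300, n_features=20, random_state=33)
X = StandardScaler().fit_transform(X)  # scaled inputs help lbfgs converge

clf = MLPClassifier(activation='tanh', solver='lbfgs',
                    hidden_layer_sizes=(100, 3),
                    max_iter=2000,  # raised from the default of 200
                    random_state=33)
clf.fit(X, y)
print(round(clf.score(X, y), 2))
```

In the notebook itself, scaling would belong in the preprocessing pipeline so that train and test sets share the same transform.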

Neural Network Classifier Using Keras

In [88]:
model = Sequential()
model.add(Dense(100, input_dim = 28, activation='relu'))

model.add(Dense(activation = 'sigmoid', units = 1))


model.compile(loss='binary_crossentropy',
             optimizer = Adam(lr=0.0001, decay=1e-5),
              metrics=['acc'])

model.fit(X_train, Y_train, epochs=100, validation_data=(X_test, Y_test))
/usr/local/lib/python3.7/dist-packages/keras/optimizer_v2/adam.py:105: UserWarning:

The `lr` argument is deprecated, use `learning_rate` instead.

Epoch 1/100
20/20 [==============================] - 1s 13ms/step - loss: 0.6577 - acc: 0.6883 - val_loss: 0.6078 - val_acc: 0.6943
Epoch 2/100
20/20 [==============================] - 0s 4ms/step - loss: 0.6004 - acc: 0.6883 - val_loss: 0.5695 - val_acc: 0.6943
Epoch 3/100
20/20 [==============================] - 0s 4ms/step - loss: 0.5703 - acc: 0.6883 - val_loss: 0.5531 - val_acc: 0.6967
... (epochs 4-98 truncated: loss fell steadily from 0.56 to 0.15 while val_acc rose from 0.70 to 0.95)
Epoch 99/100
20/20 [==============================] - 0s 4ms/step - loss: 0.1490 - acc: 0.9747 - val_loss: 0.1546 - val_acc: 0.9455
Epoch 100/100
20/20 [==============================] - 0s 4ms/step - loss: 0.1482 - acc: 0.9684 - val_loss: 0.1536 - val_acc: 0.9455
Out[88]:
<keras.callbacks.History at 0x7fd0fa270bd0>
In [89]:
model.summary()
Model: "sequential"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 dense (Dense)               (None, 100)               2900      
                                                                 
 dense_1 (Dense)             (None, 1)                 101       
                                                                 
=================================================================
Total params: 3,001
Trainable params: 3,001
Non-trainable params: 0
_________________________________________________________________
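The parameter counts in the summary follow directly from the layer shapes: each `Dense` layer has `inputs × units` weights plus `units` biases.

```python
# Dense(100) on 28 inputs, then Dense(1) on 100 inputs
dense   = 28 * 100 + 100   # weights + biases = 2900
dense_1 = 100 * 1 + 1      # weights + biases = 101
print(dense, dense_1, dense + dense_1)  # 2900 101 3001
```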
In [90]:
plot_model(model)
Out[90]:
In [91]:
Y_hat_test = model.predict(X_test).astype(int)
Y_hat_train = model.predict(X_train).astype(int)

print(classification_report(Y_test, Y_hat_test))
print("kerasNN_test_acc:",round(accuracy_score(Y_test, Y_hat_test), 2))
print("kerasNN_train_acc:" ,round(accuracy_score(Y_train, Y_hat_train), 2))
              precision    recall  f1-score   support

           0       0.31      1.00      0.47       129
           1       0.00      0.00      0.00       293

    accuracy                           0.31       422
   macro avg       0.15      0.50      0.23       422
weighted avg       0.09      0.31      0.14       422

kerasNN_test_acc: 0.31
kerasNN_train_acc: 0.31
/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1318: UndefinedMetricWarning:

Precision and F-score are ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.

/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1318: UndefinedMetricWarning:

Precision and F-score are ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.

/usr/local/lib/python3.7/dist-packages/sklearn/metrics/_classification.py:1318: UndefinedMetricWarning:

Precision and F-score are ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
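The 0.31 accuracy reported here is an artifact of the prediction step, not the network (which reached val_acc ≈ 0.95 during training): `model.predict` returns sigmoid probabilities in (0, 1), and casting them straight to `int` floors almost all of them to 0, so every sample is predicted as class 0 (129/422 ≈ 0.31). A minimal sketch of the fix, thresholding at 0.5, with toy probabilities standing in for the real outputs:

```python
import numpy as np

# Stand-ins for the sigmoid outputs of model.predict(X_test)
probs = np.array([[0.97], [0.08], [0.64]])

print(probs.astype(int).ravel())          # [0 0 0] -- the bug: everything floors to 0
print((probs > 0.5).astype(int).ravel())  # [1 0 1] -- proper class labels
```

Replacing `model.predict(X_test).astype(int)` with `(model.predict(X_test) > 0.5).astype(int)` would make the Keras evaluation comparable to the others.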

Final Accuracy Summary

In [92]:
models = ["Logistic regression", "K-nearest neighbours", "GaussianNB", "XGB",
          "Support vector machine", "Decision Tree", "Random Forest", "Linear Discriminant Analysis",
          "SVC after hyperparameter tuning", "Neural Network with SKLearn", "Neural Network with Keras"]
test_accuracy = [1, 0.94, 0.57, 0.99, 0.75, 0.92, 0.97, 0.96, 1, 0.98, 0.31]
train_accuracy = [1, 0.95, 0.56, 1, 0.74, 1, 1, 0.97, 1, 1, 0.31]
accuracy_summary = pd.DataFrame([models, test_accuracy, train_accuracy]).T
accuracy_summary.columns = ["Classifier", "test_accuracy", "train_accuracy"]
accuracy_summary
Out[92]:
Classifier test_accuracy train_accuracy
0 Logistic regression 1 1
1 K-nearest neighbours 0.94 0.95
2 GaussianNB 0.57 0.56
3 XGB 0.99 1
4 Support vector machine 0.75 0.74
5 Decision Tree 0.92 1
6 Random Forest 0.97 1
7 Linear Discriminant Analysis 0.96 0.97
8 SVC after hyperparameter tuning 1 1
9 Neural Network with SKLearn 0.98 1
10 Neural Network with Keras 0.31 0.31

Logistic Regression

Training and testing accuracy are both 100% (rounded); recall, precision, and F1-score are 100% for both the autism and no-autism classes, with only 2 misclassified samples in total. Logistic Regression with default parameters is nearly the best model.


K-nearest neighbours

Test accuracy: 0.94, train accuracy: 0.95, and weighted-average recall = 94%, so KNN is a good model.


GaussianNB

Low accuracy (0.57), so it is rejected.


XGBoost

Test accuracy: 0.99, train accuracy: 1.0, and weighted-average recall = 99%, so XGBoost is an excellent model, ranked just after Logistic Regression.


Support vector machine

Low accuracy (0.75), so it is rejected.


Decision tree

Test accuracy: 0.92 and train accuracy: 1.0. The gap indicates overfitting, so this model would need hyperparameter tuning before it could be accepted.


Random forest

Test accuracy: 0.97 and train accuracy: 1.0, so it is also a good model.


Linear Discriminant Analysis

Test accuracy: 0.96, train accuracy: 0.97, and recall = 96%, so Linear Discriminant Analysis is also a good model.


Support Vector Machine after Hyperparameter Tuning

Through grid search, training and testing accuracy reach 100%; recall, precision, and F1-score are 100% for both the autism and no-autism classes, with zero false positives and zero false negatives. The Support Vector Machine after hyperparameter tuning is the best model.


Neural Network Classifier Using SKLearn

Test accuracy: 0.98, train accuracy: 1.0, and weighted-average recall = 98%, so the scikit-learn neural network classifier is also a strong model.


Neural Network Classifier Using Keras

Lowest accuracy (0.31), so it is rejected; the failure traces back to casting the sigmoid outputs directly to int rather than to the network itself.

So the final rating of the best models is:

  1. Support Vector Machine after hyperparameter tuning.
  2. Logistic Regression.
  3. Neural Network Classifier Using SKLearn.
  4. XGBoost.
  5. Random forest.
  6. Linear Discriminant Analysis.
  7. K-nearest neighbours.
  8. Decision tree.

Conclusion

We have evaluated machine-learning models from the scikit-learn, XGBoost, and Keras libraries. Most of them yielded good results, but the support vector machine performed best of all once its parameters were tuned. So when we want to predict autism in a toddler we can run this model and expect highly accurate results: 100% accuracy on this test set. This suggests the problem of long, unproductive waits for a clinical diagnosis can be eased: for a toddler suspected of autism, a quick screening based on behavioral traits with our model can provide an accurate preliminary answer.